5 research outputs found

    A dynamic field approach to goal inference and error monitoring for human-robot interaction

    In this paper we present results of our ongoing research on non-verbal human-robot interaction that is heavily inspired by recent experimental findings about the neuro-cognitive mechanisms supporting joint action in humans. The robot control architecture implements the joint coordination of actions and goals as a dynamic process that integrates contextual cues, shared task knowledge and the predicted outcome of the user’s motor behavior. The architecture is formalized by a coupled system of dynamic neural fields representing a distributed network of local but connected neural populations with specific functionalities. We validate the approach in a task in which a robot and a human user jointly construct a toy ‘vehicle’. We show that the context-dependent mapping from action observation onto appropriate complementary actions allows the robot to cope with dynamically changing joint action situations. This includes a basic form of error monitoring and compensation. Fundação para a Ciência e a Tecnologia (FCT) - POCI/V.5/A0119/2005, CONC-REEQ/17/2001.
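    For reference, the dynamic neural fields named in this and the following abstracts are commonly formalized with an Amari-type field equation; the form below is the standard textbook version, given as a reminder rather than the exact formulation used in the paper:

        \tau \,\frac{\partial u(x,t)}{\partial t} = -u(x,t) + \int w(x - x')\, f\big(u(x',t)\big)\, dx' + S(x,t) + h

    Here u(x,t) is the activation of the population coding for value x, w is a lateral interaction kernel (local excitation, broader inhibition), f a sigmoidal output nonlinearity, S(x,t) the external input (e.g. contextual cues or observed actions), and h the resting level.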

    A dynamic field approach to goal inference, error detection and anticipatory action selection in human-robot collaboration

    In this chapter we present results of our ongoing research on efficient and fluent human-robot collaboration that is heavily inspired by recent experimental findings about the neurocognitive mechanisms supporting joint action in humans. The robot control architecture implements the joint coordination of actions and goals as a dynamic process that integrates contextual cues, shared task knowledge and the predicted outcome of the user's motor behavior. The architecture is formalized as a coupled system of dynamic neural fields representing a distributed network of local but connected neural populations with specific functionalities. We validate the approach in a task in which a robot and a human user jointly construct a toy 'vehicle'. We show that the context-dependent mapping from action observation onto appropriate complementary actions allows the robot to cope with dynamically changing joint action situations. More specifically, the results illustrate crucial cognitive capacities for efficient and successful human-robot collaboration such as goal inference, error detection and anticipatory action selection. FCT grants POCI/V.5/A0119/2005 and CONC-REEQ/17/2001 / FP6-IST2 EU-IP Project JAST (proj. nr. 003747).
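    The architecture described above couples several such fields. As a loose, illustrative sketch (not the published model: the field names, parameter values, Gaussian kernel and the hard-coded observation-to-complement shift are all assumptions), two 1-D Amari fields can be coupled so that a peak representing an observed user action drives a peak at the complementary action in a selection field:

        # Minimal sketch: two coupled 1-D dynamic neural fields (Amari-type).
        # All names and parameter values are illustrative assumptions, not the
        # published architecture.
        import numpy as np

        N = 101
        x = np.linspace(-10.0, 10.0, N)          # parameter axis (e.g. action type)
        dx = x[1] - x[0]
        tau, h = 10.0, -2.0                      # time constant, resting level

        def gauss(center, sigma, amp=1.0):
            return amp * np.exp(-0.5 * ((x - center) / sigma) ** 2)

        # lateral interaction: local excitation, broader inhibition
        kernel = gauss(0.0, 1.5, 2.0) - gauss(0.0, 4.0, 1.0)

        def f(u):                                # sigmoidal output nonlinearity
            return 1.0 / (1.0 + np.exp(-4.0 * u))

        def step(u, ext_input, dt=1.0):
            lateral = dx * np.convolve(f(u), kernel, mode="same")
            return u + dt / tau * (-u + h + lateral + ext_input)

        u_obs = np.full(N, h)                    # field coding the observed user action
        u_sel = np.full(N, h)                    # field selecting the complementary action

        observed = gauss(-3.0, 1.0, 6.0)         # stand-in for "user reaches for part A"
        for _ in range(200):
            u_obs = step(u_obs, observed)
            # a peak in the observation field feeds the (shifted) complementary
            # location in the selection field
            u_sel = step(u_sel, 3.0 * np.roll(f(u_obs), 30))

        print("complementary action selected at x =", round(x[np.argmax(u_sel)], 1))

    In this reading, goal inference and error detection correspond, roughly, to which peaks form (or fail to form) in such coupled fields as contextual and observed inputs change.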

    Combining goal inference and natural-language dialogue for human-robot joint action

    We demonstrate how combining the reasoning components from two existing systems designed for human-robot joint action produces an integrated system with greater capabilities than either of the individual systems. One of the systems supports primarily non-verbal interaction and uses dynamic neural fields to infer the user’s goals and to suggest appropriate system responses; the other emphasises natural-language interaction and uses a dialogue manager to process user input and select appropriate system responses. Combining these two methods of reasoning results in a robot that is able to coordinate its actions with those of the user while employing a wide range of verbal and non-verbal communicative actions.
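    As a rough illustration of what such an integration loop might look like (the component names, confidence scores and winner-takes-all rule are assumptions for this sketch, not the integration described in the paper), each cycle can collect a candidate response from the field-based goal-inference module and from the dialogue manager and pick one to execute:

        # Hedged sketch of an arbitration loop over two reasoning components.
        # Both components are stubs standing in for the dynamic-field
        # architecture and the natural-language dialogue manager.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Suggestion:
            action: str          # motor action or verbal act
            confidence: float    # 0..1

        def field_based_suggestion(observed_action: str) -> Optional[Suggestion]:
            # stand-in for reading out the peak of the action-selection field
            mapping = {"reach_for_wheel": Suggestion("hand_over_axle", 0.9)}
            return mapping.get(observed_action)

        def dialogue_suggestion(utterance: str) -> Optional[Suggestion]:
            # stand-in for the dialogue manager's interpretation of user speech
            if "which" in utterance.lower():
                return Suggestion("verbally_describe_next_step", 0.8)
            return None

        def integrate(observed_action: str, utterance: str) -> Suggestion:
            candidates = [s for s in (field_based_suggestion(observed_action),
                                      dialogue_suggestion(utterance)) if s]
            return max(candidates, key=lambda s: s.confidence,
                       default=Suggestion("wait", 0.0))

        print(integrate("reach_for_wheel", "Which part goes next?").action)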

    The power of prediction: robots that read intentions

    Humans are experts in cooperating in a smooth and proactive manner. Action and intention understanding are critical components of efficient joint action. In the context of the EU Integrated Project JAST [16], we have developed an anthropomorphic robot endowed with these cognitive capacities. This project and the respective robot (ARoS) are the focus of the video. More specifically, the results illustrate crucial cognitive capacities for efficient and successful human-robot collaboration, such as goal inference, error detection and anticipatory action selection. The results were considered one of the ICT "success stories". "JAST: Joint-Action Science and Technology" (Ref. IST-2-003747-IP); FCT FCOMP-01-0124-FEDER-022674.